Bayesian case-deletion model complexity and information criterion
Authors
Abstract
Similar Articles
Bayesian Model Averaging and Bayesian Predictive Information Criterion for Model Selection
The problem of evaluating the goodness of the predictive distributions developed by the Bayesian model averaging approach is investigated. Considering the maximization of the posterior mean of the expected log-likelihood of the predictive distributions (Ando, 2007a), we develop the Bayesian predictive information criterion (BPIC). Through numerical examples, we show that the posterior...
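For orientation only, a schematic paraphrase in our own notation (not the paper's exact expressions): the target is the posterior mean of the expected log-likelihood, and BPIC is a bias-corrected in-sample estimate of it,

\eta = \int \Big[ \int \log p(z \mid \theta)\, \pi(\theta \mid y)\, d\theta \Big]\, g(z)\, dz, \qquad
\mathrm{BPIC} = -2 \int \log p(y \mid \theta)\, \pi(\theta \mid y)\, d\theta + 2 n \hat{b},

where g denotes the true data-generating density and \hat{b} an asymptotic estimate of the bias incurred by evaluating the plug-in on the same data y used to form the posterior.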
The Bayesian Information Criterion for Model Selection in Censored Survival Models
We investigate the Bayesian Information Criterion (BIC) for variable selection in models for censored survival data. Kass and Wasserman (1995) showed that BIC provides a close approximation to the Bayes factor when a unit-information prior on the parameter space is used. We propose a revision of the penalty term in BIC so that it is defined in terms of the number of uncensored events instead of ...
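In schematic form (our notation, not quoted from the abstract), the proposal replaces the total sample size n in the BIC penalty with the number of uncensored events d:

\mathrm{BIC} = -2 \log L(\hat\theta) + p \log n, \qquad \mathrm{BIC}_{\text{revised}} = -2 \log L(\hat\theta) + p \log d,

where L(\hat\theta) is the maximized (partial) likelihood and p the number of free parameters.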
The Application of Bayesian Information Criterion in Acoustic Model Refinement
Automatic speech recognition (ASR) systems usually consist of an acoustic model and a language model. This paper describes a technique for the efficient deployment of the acoustic model parameters. The acoustic model typically utilizes Continuous Density Hidden Markov Models (CDHMMs). The output probability of a particular CDHMM state is represented by a Gaussian mixture density with a diagonal co...
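As an illustration of the general idea only (not the paper's actual procedure), the sketch below uses scikit-learn's GaussianMixture on synthetic feature vectors standing in for acoustic frames, and lets BIC choose the number of diagonal-covariance mixture components for a single state's output density:

import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic 13-dimensional feature vectors standing in for MFCC frames
# assigned to one HMM state (purely illustrative data).
rng = np.random.default_rng(0)
frames = np.vstack([rng.normal(0.0, 1.0, (200, 13)),
                    rng.normal(3.0, 1.0, (200, 13))])

# Fit diagonal-covariance Gaussian mixtures of increasing size and keep
# the one with the lowest BIC (roughly -2 log L + #params * log N).
best_k, best_bic = None, np.inf
for k in range(1, 9):
    gmm = GaussianMixture(n_components=k, covariance_type="diag",
                          random_state=0).fit(frames)
    bic = gmm.bic(frames)
    if bic < best_bic:
        best_k, best_bic = k, bic

print(f"BIC-selected number of mixture components: {best_k}")

In a real recognizer, comparisons of this kind typically guide decisions such as splitting or merging mixture components state by state.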
Akaike's Information Criterion and Recent Developments in Information Complexity
In this paper we briefly study the basic idea of Akaike's (1973) information criterion (AIC). Then, we present some recent developments on a new entropic or information complexity (ICOMP) criterion of Bozdogan (1988a, 1988b, 1990, 1994d, 1996, 1998a, 1998b) for model selection. A rationale for ICOMP as a model selection criterion is that it combines a badness-of-fit term (such as minus twice th...
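For reference, a sketch of the commonly cited form of ICOMP in our notation (not quoted from the paper): the badness-of-fit term is augmented with an entropic complexity of the estimated covariance of the parameter estimates,

\mathrm{ICOMP} = -2 \log L(\hat\theta) + 2\, C_1\big(\widehat{\Sigma}\big), \qquad
C_1(\Sigma) = \frac{s}{2} \log\!\frac{\operatorname{tr}(\Sigma)}{s} - \frac{1}{2} \log\det \Sigma,

where \widehat{\Sigma} is an estimate of the covariance matrix of \hat\theta (for example, the inverse Fisher information) and s = \operatorname{rank}(\Sigma).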
A widely applicable Bayesian information criterion
A statistical model or a learning machine is called regular if the map taking a parameter to a probability distribution is one-to-one and if its Fisher information matrix is always positive definite; otherwise, it is called singular. In regular statistical models, the Bayes free energy, which is defined as the minus logarithm of the Bayes marginal likelihood, can be asymptotically approximated b...
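In our notation, a sketch of the standard definitions (not quoted from the abstract): the Bayes free energy and Watanabe's widely applicable BIC are

F_n = -\log \int \prod_{i=1}^{n} p(x_i \mid \theta)\, \varphi(\theta)\, d\theta, \qquad
\mathrm{WBIC} = \mathbb{E}_\theta^{\beta}\!\Big[ -\sum_{i=1}^{n} \log p(x_i \mid \theta) \Big], \quad \beta = \frac{1}{\log n},

where \mathbb{E}_\theta^{\beta} denotes expectation under the posterior tempered to inverse temperature \beta. In regular models both behave like BIC up to a factor of 2, and in singular models both share the leading expansion n S_n + \lambda \log n, with \lambda the real log canonical threshold.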
Journal
Journal title: Statistics and Its Interface
Year: 2014
ISSN: 1938-7989, 1938-7997
DOI: 10.4310/sii.2014.v7.n4.a9